Neural Netw. 2023 Feb;159:125-136.
Article in English | MEDLINE | ID: mdl-36565690

ABSTRACT

Artificial neural networks (ANNs) have been widely adopted as general computational tools in computer science and in many other engineering fields. Stochastic gradient descent (SGD) and adaptive methods such as Adam are popular, robust optimization algorithms for training ANNs. However, their effectiveness is limited because they compute the search direction from first-order gradient information alone. Higher-order methods such as Newton's method have been proposed, but they require the Hessian matrix to be semi-definite, and inverting it incurs a high computational cost. In this paper, we therefore propose a variable three-term conjugate gradient (VTTCG) method that approximates the Hessian matrix to enhance the search direction and uses a variable step size to achieve improved convergence stability. To evaluate the performance of the VTTCG method, we train different ANNs on benchmark image classification and generation datasets. We also conduct a similar experiment in which a grasp generation and selection convolutional neural network (GGS-CNN) is trained to perform intelligent robotic grasping. In addition to a simulated environment, we test the GGS-CNN on a physical grasping robot. The experimental results show that the VTTCG method outperforms four conventional methods: SGD, Adam, AMSGrad, and AdaBelief.
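For orientation only: the abstract describes the idea of a three-term conjugate gradient update but does not give the VTTCG coefficient formulas. The Python sketch below shows a generic three-term conjugate gradient step on a toy quadratic loss. The beta and theta coefficients follow a standard three-term scheme (a Hestenes-Stiefel beta plus a descent-preserving third term) chosen purely for illustration, and the exact line search stands in for the paper's variable step-size rule; none of these choices should be read as the authors' method.

import numpy as np

# Generic three-term conjugate gradient (CG) sketch on a toy quadratic loss
# f(w) = 0.5 w^T A w - b^T w. Illustrative only; not the VTTCG update rules.

def gradient(w, A, b):
    """Gradient of the quadratic loss: A w - b."""
    return A @ w - b

rng = np.random.default_rng(0)
n = 20
M = rng.standard_normal((n, n))
A = M @ M.T + n * np.eye(n)   # symmetric positive definite stand-in Hessian
b = rng.standard_normal(n)

w = np.zeros(n)
g = gradient(w, A, b)
d = -g                        # first search direction: steepest descent
for k in range(100):
    alpha = -(g @ d) / (d @ A @ d)   # exact line search (valid for quadratics)
    w = w + alpha * d
    g_new = gradient(w, A, b)
    y = g_new - g                    # change in gradient between iterates
    denom = d @ y
    if abs(denom) < 1e-12:
        d = -g_new                   # degenerate denominator: restart
    else:
        beta = (g_new @ y) / denom   # Hestenes-Stiefel coefficient
        theta = (g_new @ d) / denom  # third-term weight (illustrative choice)
        # Three-term direction: steepest descent plus memory of d and of y.
        # This choice gives g_new @ d = -||g_new||^2, i.e. sufficient descent.
        d = -g_new + beta * d - theta * y
    g = g_new
    if np.linalg.norm(g) < 1e-8:
        break

print(f"iterations: {k + 1}, final gradient norm: {np.linalg.norm(g):.2e}")

The third term uses the gradient difference y to correct the recycled direction, which is what distinguishes three-term CG from the classical two-term form; the paper's contribution, per the abstract, is making these coefficients and the step size variable via a Hessian approximation.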


Subjects
Neural Networks, Computer; Robotics; Algorithms; Benchmarking